One of the most important problems in data transmission in packet networks, and in wireless sensor networks in particular, is the periodic overflow of buffers that accumulate packets directed to a given node. During a buffer overflow, all newly arriving packets are lost until the overflow condition terminates. From the point of view of network optimization, it is therefore essential to know the probabilistic nature of this phenomenon, including the probability distribution of the duration of the buffer overflow period. In this article, a mathematical model of a wireless sensor network node with a discrete time parameter is proposed. The model is governed by a finite-buffer discrete-time queueing system with geometrically distributed interarrival times and a general distribution of processing times. A system of equations for the tail cumulative distribution function of the duration of the first buffer overflow period, conditioned on the initial state of the accumulating buffer, is derived. The solution of the corresponding system, written in terms of probability generating functions, is found using an analytical approach based on the embedded Markov chain technique and linear algebra. A corresponding result for subsequent buffer overflow periods is obtained as well. A numerical study illustrating the theoretical results is attached.
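To make the model concrete, the following Python sketch simulates the type of system described above: a discrete-time, finite-buffer queue with geometric (Bernoulli per-slot) arrivals and generally distributed service times, recording the durations of buffer overflow periods, i.e., the intervals during which the buffer is full and arriving packets are lost. This is only an illustrative simulation under assumed slot ordering (arrival before service completion), not the paper's analytical method; all function and parameter names are hypothetical.

```python
import random

def simulate_overflow_periods(p_arrival, service_pmf, capacity, n_slots, seed=0):
    """Simulate a discrete-time finite-buffer queue (illustrative sketch).

    p_arrival   -- per-slot arrival probability (geometric interarrivals)
    service_pmf -- dict {service duration in slots: probability}
    capacity    -- buffer size, counting the packet in service
    n_slots     -- number of time slots to simulate
    Returns the list of observed overflow period durations (in slots).
    """
    rng = random.Random(seed)
    values = list(service_pmf)
    weights = [service_pmf[v] for v in values]
    queue = 0              # packets present, including the one in service
    remaining = 0          # slots of service left for the current packet
    overflow_start = None  # slot at which the current overflow period began
    periods = []           # collected overflow period durations
    for t in range(n_slots):
        # Arrival phase: at most one packet arrives per slot.
        if rng.random() < p_arrival:
            if queue < capacity:
                queue += 1
            # else: the packet is lost (buffer full).
        # Start a new service if the server is idle and a packet waits.
        if remaining == 0 and queue > 0:
            remaining = rng.choices(values, weights)[0]
        # An overflow period begins when the buffer becomes full.
        if queue == capacity and overflow_start is None:
            overflow_start = t
        # Service phase: one slot of work completes.
        if remaining > 0:
            remaining -= 1
            if remaining == 0:
                queue -= 1  # departure; buffer is no longer full
                if overflow_start is not None:
                    periods.append(t + 1 - overflow_start)
                    overflow_start = None
    return periods
```

For example, with a heavy load such as `simulate_overflow_periods(0.9, {3: 1.0}, 2, 5000)`, overflow periods occur frequently, and their empirical distribution can be compared against the tail cumulative distribution function derived analytically in the article.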